probably approximately correct learning

In computational learning theory, probably approximately correct learning (PAC learning) is a framework for the mathematical analysis of machine learning. It was proposed in 1984 by Leslie Valiant.〔L. Valiant. ''A theory of the learnable.'' Communications of the ACM, 27(11):1134–1142, 1984.〕
In this framework, the learner receives samples and must select a generalization function (called the ''hypothesis'') from a certain class of possible functions. The goal is that, with high probability (the "probably" part), the selected function will have low generalization error (the "approximately correct" part). The learner must be able to learn the concept given any approximation ratio, probability of success, or distribution of the samples.
The model was later extended to treat noise (misclassified samples).
An important innovation of the PAC framework is the introduction of computational complexity theory concepts to machine learning. In particular, the learner is expected to find efficient functions (with time and space requirements bounded by a polynomial in the example size), and the learner itself must implement an efficient procedure (requiring a number of examples bounded by a polynomial in the concept size, as well as in the approximation and confidence bounds).
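As a rough formalization of these requirements (a sketch using symbols not fixed by the article: m for the number of examples, n for the instance length, and \mathrm{size}(c) for the representation size of the target concept), an efficient learner must satisfy

m \le \mathrm{poly}\left(\frac{1}{\epsilon}, \frac{1}{\delta}, n, \mathrm{size}(c)\right),

with its running time bounded by a polynomial in the same quantities, where \epsilon and \delta are the approximation and confidence parameters defined below.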
== Definitions and terminology ==

In order to give the definition for something that is PAC-learnable, we first have to introduce some terminology.〔Kearns and Vazirani, pp. 1–12〕〔B. K. Natarajan, ''Machine Learning: A Theoretical Approach'', Morgan Kaufmann Publishers, 1991〕
For the following definitions, two running examples will be used. The first is the problem of character recognition given an array of n bits encoding a binary-valued image. The other is the problem of finding an interval that will correctly classify points within the interval as positive and points outside it as negative.
Let X be a set called the ''instance space'' or the encoding of all the samples, where each ''instance'' has an assigned length. In the character recognition problem, the instance space is X=\{0,1\}^n. In the interval problem the instance space is X=\mathbb{R}, where \mathbb{R} denotes the set of all real numbers.
A ''concept'' is a subset c \subset X. One concept is the set of all patterns of bits in X=\{0,1\}^n that encode a picture of the letter "P". An example concept from the second example is the set of all of the numbers between \pi/2 and \sqrt{10}. A ''concept class'' C is a set of concepts over X. This could be the set of all subsets of the array of bits that are skeletonized and 4-connected (that is, the width of the font is 1).
Let EX(c,D) be a procedure that draws an example x according to a probability distribution D and gives its correct label c(x), that is, 1 if x \in c and 0 otherwise.
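For concreteness, here is a minimal sketch of such an oracle for the interval example, in Python. The names interval_concept and draw_example, the interval endpoints, and the uniform distribution on [0, 4] standing in for D are all illustrative assumptions, not part of the article:

import random

# Hidden target concept for the interval example: c(x) = 1 iff LOW <= x <= HIGH.
# The endpoints are illustrative values, not from the article.
LOW, HIGH = 1.0, 3.0

def interval_concept(x):
    """c(x): 1 if x lies in the target interval, 0 otherwise."""
    return 1 if LOW <= x <= HIGH else 0

def draw_example():
    """EX(c, D): draw x from D (here, uniform on [0, 4]) and return (x, c(x))."""
    x = random.uniform(0.0, 4.0)
    return x, interval_concept(x)

# A learner interacts only with the oracle, never with LOW/HIGH directly.
print([draw_example() for _ in range(5)])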
Say that there is an algorithm A that, given access to EX(c,D) and inputs \epsilon and \delta, outputs with probability at least 1-\delta a hypothesis h \in C whose error on examples drawn from X according to the distribution D is at most \epsilon. If there is such an algorithm for every concept c \in C, for every distribution D over X, and for all 0<\epsilon<1/2 and 0<\delta<1/2, then C is PAC learnable (or ''distribution-free PAC learnable''). We can also say that A is a PAC learning algorithm for C.
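Written out symbolically (a standard formalization consistent with the definition above; the notation \mathrm{err}_D is an assumption of this sketch, not the article's), the requirement is

\mathrm{err}_D(h) = \Pr_{x \sim D}[h(x) \neq c(x)] \qquad \text{and} \qquad \Pr[\mathrm{err}_D(h) \le \epsilon] \ge 1-\delta,

where the outer probability is taken over the sample drawn from EX(c,D) and any internal randomness of A.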
An algorithm runs in time t if it draws at most t examples and requires at most t time steps. A concept class is ''efficiently PAC learnable'' if it is PAC learnable by an algorithm that runs in time polynomial in 1/\epsilon, 1/\delta, and the instance length.
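As an illustration, here is a sketch of the standard tightest-fit learner for the interval example, which is efficiently PAC learnable in this sense. The sample bound m \ge (2/\epsilon)\ln(2/\delta) follows from the usual argument that the hypothesis can only err on two edge regions of the target interval, each of which is hit by some sample with high probability; the function names are illustrative:

import math
import random

def learn_interval(epsilon, delta, draw_example):
    """Tightest-fit PAC learner for the class of intervals on the real line.

    Draws m >= (2/epsilon) * ln(2/delta) labeled examples from the oracle
    and returns the smallest interval containing all positive examples.
    With probability at least 1 - delta, its error under D is at most epsilon.
    """
    m = math.ceil((2.0 / epsilon) * math.log(2.0 / delta))
    samples = [draw_example() for _ in range(m)]
    positives = [x for x, label in samples if label == 1]
    if not positives:
        # Empty hypothesis (always predict 0): if no positives were seen,
        # the target interval's mass under D is likely at most epsilon.
        return None
    return min(positives), max(positives)

def predict(hypothesis, x):
    """Apply the learned hypothesis to a new point."""
    if hypothesis is None:
        return 0
    low, high = hypothesis
    return 1 if low <= x <= high else 0

Both the number of examples drawn and the single min/max pass over them are linear in m, so the learner runs in time polynomial in 1/\epsilon and 1/\delta, matching the definition above. With the draw_example oracle sketched earlier, learn_interval(0.05, 0.05, draw_example) typically returns an interval close to the hidden one.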
